
# Low-Resource Training

**nanoVLM-222M** (Apache-2.0)
nanoVLM is an ultra-minimalist, lightweight vision-language model (VLM) designed for efficient training and experimentation.
Image-to-Text · lusxvr · 2,441 · 73
**Multilingual ModernBERT Base Preview** (MIT)
A multilingual ModernBERT model developed by the Algomatic team, supporting mask-filling tasks with an 8,192-token context length and a vocabulary of 151,680.
Large Language Model · Safetensors · makiart · 60 · 4
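A mask-filling model like the one above is typically queried through the `transformers` fill-mask pipeline. The sketch below is a hedged illustration, not documented usage for this exact model: the Hub repo id `makiart/multilingual-ModernBert-base-preview` is an assumption inferred from the author and model name in the listing, and the helper reads the mask token from the tokenizer instead of hard-coding it.

```python
def top_predictions(fill_pipe, template: str, k: int = 3):
    """Fill the pipeline's own mask token into `template` and return the
    top-k predicted token strings."""
    mask = fill_pipe.tokenizer.mask_token
    return [r["token_str"] for r in fill_pipe(template.format(mask=mask), top_k=k)]

if __name__ == "__main__":
    # Heavy dependency deferred so the helper stays importable on its own.
    from transformers import pipeline

    # Assumed Hub repo id, inferred from the listing above.
    fill = pipeline("fill-mask", model="makiart/multilingual-ModernBert-base-preview")
    print(top_predictions(fill, "Paris is the capital of {mask}."))
```

Using `fill_pipe.tokenizer.mask_token` keeps the template correct whether the model expects `[MASK]` or `<mask>`.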
**Gugugo-koen-7B-V1.1** (Apache-2.0)
Gugugo-koen-7B-V1.1 is a Korean-English translation model based on Llama-2-ko-7b, designed for high-quality translation between Korean and English.
Machine Translation · Transformers · Multilingual · squarelike · 94 · 17
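Since this translator is a causal language model rather than a seq2seq model, it is driven by a prompt template. The sketch below is an assumption-laden illustration: both the repo id `squarelike/Gugugo-koen-7B-V1.1` and the `### 한국어:` / `### 영어:` template with the `</끝>` terminator are inferred from the listing and from the model's reported usage, and should be verified against the model card.

```python
def build_prompt(korean: str) -> str:
    """Wrap a Korean sentence in the prompt template this model is reported
    to use (assumption -- verify on the model card): a '### 한국어:' section
    terminated by '</끝>', then an empty '### 영어:' section for the model
    to complete with the English translation."""
    return f"### 한국어: {korean}</끝>\n### 영어:"

if __name__ == "__main__":
    # Heavy dependencies deferred; repo id is assumed from the listing above.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "squarelike/Gugugo-koen-7B-V1.1"
    tok = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

    inputs = tok(build_prompt("안녕하세요, 만나서 반갑습니다."), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated (English) continuation.
    print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```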
**SD Onepiece Diffusers4** (Apache-2.0)
A Stable Diffusion model trained with the Diffusers library and fine-tuned on a One Piece anime dataset.
Image Generation · TensorBoard · English · YaYaB · 18 · 11
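A Diffusers-trained Stable Diffusion checkpoint like the one above is normally loaded with `StableDiffusionPipeline`. This is a hedged sketch: the repo id `YaYaB/sd-onepiece-diffusers4` is an assumption inferred from the author and model name in the listing, and the example prompt is illustrative, not from the model card.

```python
def generate_image(pipe, prompt: str, generator=None):
    """Run one text-to-image call and return the first image from the
    pipeline output; pass a seeded torch.Generator for reproducibility."""
    return pipe(prompt, generator=generator).images[0]

if __name__ == "__main__":
    # Heavy dependencies deferred; repo id is assumed from the listing above.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "YaYaB/sd-onepiece-diffusers4", torch_dtype=torch.float16
    ).to("cuda")
    gen = torch.Generator("cuda").manual_seed(0)
    image = generate_image(pipe, "a portrait of a pirate captain, onepiece style", gen)
    image.save("onepiece.png")
```

Seeding the generator makes runs repeatable, which is useful when comparing a style-tuned checkpoint like this against the base model.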
© 2025 AIbase